# Efficient attention

## RuRoPEBert E5 Base 2k
A Russian sentence encoder based on the RoPEBert architecture, supporting a context length of 2,048 tokens and performing strongly on the encodechka benchmark.
Tags: Text Embedding, Transformers, Other · Publisher: Tochka-AI · Downloads: 2,422 · Likes: 11
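The defining ingredient of a RoPE-based encoder is the rotary position embedding: each pair of feature dimensions is rotated by a position-dependent angle, so that query–key dot products encode relative position. A minimal NumPy sketch of that rotation (illustrative only; function names and shapes here are assumptions, not the model's actual implementation):

```python
import numpy as np

def rotary_embed(x, base=10000.0):
    """Apply rotary position embeddings (RoPE) to a sequence of vectors.

    x: array of shape (seq_len, dim), with dim even.
    Each dimension pair (2i, 2i+1) is rotated by an angle that grows with
    the token position; pairs rotate at different frequencies.
    """
    seq_len, dim = x.shape
    half = dim // 2
    # Per-pair rotation frequencies (geometric schedule, as in RoPE).
    inv_freq = base ** (-np.arange(half) / half)
    # Angle for every (position, pair) combination: shape (seq_len, half).
    angles = np.outer(np.arange(seq_len), inv_freq)
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[:, 0::2], x[:, 1::2]
    out = np.empty_like(x)
    out[:, 0::2] = x1 * cos - x2 * sin  # 2D rotation applied per pair
    out[:, 1::2] = x1 * sin + x2 * cos
    return out

# Rotation preserves vector norms, so RoPE injects position information
# without rescaling the embeddings.
q = np.random.default_rng(0).normal(size=(8, 64))
rq = rotary_embed(q)
```

Because each pair undergoes a pure rotation, the per-token vector norm is unchanged, and the dot product between a rotated query and key depends only on their positional offset.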
## LongT5 Local Large
License: Apache-2.0
LongT5 is a text-to-text Transformer extended from T5 that processes long input sequences efficiently, making it particularly well suited to text generation over long documents.
Tags: Large Language Model, Transformers, English · Publisher: google · Downloads: 177 · Likes: 5
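The "Local" in this model's name refers to its local attention pattern: each token attends only to neighbors within a fixed radius, cutting attention cost from quadratic to linear in sequence length. A minimal NumPy sketch of the idea (a simplification under assumed shapes, not LongT5's actual block-wise implementation):

```python
import numpy as np

def local_attention_mask(seq_len, radius):
    """Boolean mask: True where position i may attend to position j.

    Restricting each token to a window of +/- radius neighbors reduces
    attention cost from O(n^2) to O(n * radius).
    """
    idx = np.arange(seq_len)
    return np.abs(idx[:, None] - idx[None, :]) <= radius

def local_attention(q, k, v, radius):
    """Single-head scaled dot-product attention under a local mask."""
    scores = q @ k.T / np.sqrt(q.shape[-1])
    mask = local_attention_mask(q.shape[0], radius)
    # Disallowed positions get -inf, which softmax turns into weight 0.
    scores = np.where(mask, scores, -np.inf)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v
```

Each output row is a convex combination of only the value vectors inside the window; the diagonal is always unmasked, so the softmax is well defined at every position.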
© 2025 AIbase